Affective computing in general, and human activity and intention analysis in particular, comprise a rapidly growing field of research. Head pose and emotion changes present serious challenges when applied to a player's training and ludology experience in serious games, to the analysis of customer satisfaction with broadcast and web services, or to the monitoring of a driver's attention. Given the increasing prominence and utility of depth sensors, it is now feasible to perform large-scale collection of three-dimensional (3D) data for subsequent analysis. Discriminative random regression forests were selected in order to rapidly and accurately estimate head pose changes in an unconstrained environment. For the secondary process of recognising four universal dominant facial expressions (happiness, anger, sadness and surprise), emotion recognition via facial expressions (ERFE) was adopted. A lightweight data exchange format, JavaScript Object Notation (JSON), is then employed to manipulate the data extracted from the two aforementioned settings. Motivated by the need to generate comprehensible visual representations from different sets of data, in this paper we introduce a system capable of monitoring human activity through head pose and emotion changes, utilising an affordable 3D sensing technology (the Microsoft Kinect sensor).
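The JSON exchange step described above can be sketched as follows. This is a minimal illustration only: the field names, units, and record layout are assumptions for the example, not the paper's actual schema.

```python
import json

# Hypothetical record: per-frame head-pose estimate (Euler angles in
# degrees) and recognised emotion, bundled for exchange between the
# head-pose and emotion-recognition modules. Field names are illustrative.
frame_record = {
    "timestamp_ms": 1533,
    "head_pose": {"yaw": -12.4, "pitch": 5.1, "roll": 0.8},
    "emotion": "happiness",
}

# Serialise to a compact JSON string for transport.
message = json.dumps(frame_record, separators=(",", ":"))

# Parse it back on the receiving (visualisation) side.
decoded = json.loads(message)
assert decoded["emotion"] == "happiness"
```

JSON's lightweight, language-independent structure is what makes it suitable here: both the pose-estimation and the expression-recognition outputs can be merged into one stream that a visualisation front end consumes directly.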